On Quantile Regression in Reproducing Kernel Hilbert Spaces with the Data Sparsity Constraint

Authors

  • Chong Zhang
  • Yufeng Liu
  • Yichao Wu
Abstract

For spline regressions, it is well known that the choice of knots is crucial for the performance of the estimator. As a general learning framework covering smoothing splines, learning in a Reproducing Kernel Hilbert Space (RKHS) has a similar issue. However, the selection of training data points for kernel functions in the RKHS representation has not been carefully studied in the literature. In this paper we study quantile regression as an example of learning in an RKHS. In this case, the regular squared norm penalty does not perform training data selection. We propose a data sparsity constraint that imposes thresholding on the kernel function coefficients to achieve a sparse kernel function representation. We demonstrate that the proposed data sparsity method can achieve competitive prediction performance in certain situations, and comparable performance in other cases, relative to the traditional squared norm penalty. Therefore, the data sparsity method can serve as a competitive alternative to the squared norm penalty method. Some theoretical properties of our proposed method using the data sparsity constraint are obtained. Both simulated and real data sets are used to demonstrate the usefulness of our data sparsity constraint.
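The idea of sparsifying the kernel coefficients can be sketched with a small example. The code below is a hypothetical illustration, not the authors' exact formulation: it uses an L1 penalty on the coefficients (a common convex surrogate for a thresholding-style data sparsity constraint), a Gaussian kernel, the pinball (check) loss for the τ-th quantile, and omits the intercept for brevity. The function name `sparse_kqr_fit` and all parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def sparse_kqr_fit(X, y, tau=0.5, lam=1.0, gamma=1.0):
    """Sketch: fit f(x) = sum_i alpha_i K(x_i, x) by minimizing
    sum_i pinball_tau(y_i - f(x_i)) + lam * ||alpha||_1,
    cast as a linear program (intercept omitted for brevity)."""
    n = len(y)
    # Gaussian (RBF) kernel matrix over the training points
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-gamma * sq_dists)
    I = np.eye(n)
    # Decision vector z = [alpha+, alpha-, u, v], all >= 0, with
    # alpha = alpha+ - alpha- and y - K @ alpha = u - v; the pinball
    # loss then equals tau * u + (1 - tau) * v at the optimum.
    c = np.concatenate([lam * np.ones(2 * n),
                        tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([K, -K, I, -I])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    alpha = res.x[:n] - res.x[n:2 * n]
    return alpha, K

# Toy usage: many alpha_i shrink to exactly zero, so only a subset of
# training points contributes kernel functions to the fitted quantile.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(40)
alpha, K = sparse_kqr_fit(X, y, tau=0.5, lam=0.5)
frac_zero = np.mean(np.abs(alpha) < 1e-8)
```

The zero coefficients play the role that unselected knots play in spline regression: the corresponding training points drop out of the fitted function entirely.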


Related papers

Fisher’s Linear Discriminant Analysis for Weather Data by reproducing kernel Hilbert spaces framework

Recently, with the development of science and technology, data of a functional nature have become easy to collect. Hence, statistical analysis of such data is of great importance. As in multivariate analysis, linear combinations of random variables play a key role in functional analysis. The theory of Reproducing Kernel Hilbert Spaces is very important in this context. In this paper we study a gen...


Some Properties of Reproducing Kernel Banach and Hilbert Spaces

This paper is devoted to the study of reproducing kernel Hilbert spaces. We focus on multipliers of reproducing kernel Banach and Hilbert spaces. In particular, we try to extend this concept and prove some related theorems. Moreover, we focus on reproducing kernels in vector-valued reproducing kernel Hilbert spaces. In particular, we extend reproducing kernels to relative reproducing kernels an...


Quantile Regression in Reproducing Kernel Hilbert Spaces

In this paper we consider quantile regression in reproducing kernel Hilbert spaces, which we refer to as kernel quantile regression (KQR). We make three contributions: (1) we propose an efficient algorithm that computes the entire solution path of the KQR, with essentially the same computational cost as fitting one KQR model; (2) we derive a simple formula for the effective dimension of the KQR...


Data sparse nonparametric regression with $ε$-insensitive losses

Leveraging the celebrated support vector regression (SVR) method, we propose a unifying framework in order to deliver regression machines in reproducing kernel Hilbert spaces (RKHSs) with data sparsity. The central point is a new definition of ε-insensitivity, valid for many regression losses (including quantile and expectile regression) and their multivariate extensions. We show that the dual o...


Approximation Analysis of Learning Algorithms for Support Vector Regression and Quantile Regression

We study learning algorithms generated by regularization schemes in reproducing kernel Hilbert spaces associated with an ε-insensitive pinball loss. This loss function is motivated by the ε-insensitive loss for support vector regression and the pinball loss for quantile regression. Approximation analysis is conducted for these algorithms by means of a variance-expectation bound when a noise condi...



Journal:
  • Journal of Machine Learning Research (JMLR)

Volume 17, Issue 40

Pages: -

Publication year: 2016